Concerns about mass manipulation arise as new technologies use our data to predict and monetize our behaviors. There are quite a few examples (links below).
“Uber redesigns app to predict where riders are headed and give them more to do in the car”
Facebook and Cambridge Analytica (Trailer)
Key Concepts:
Hypernudging: “drawing in Big Data to nudge individuals with personalized feedback to change their behavior” (Lanzing 2019, 550)
Self-tracking: “Fuelled by real-time data, algorithms create personalized online choice architectures* that aim to nudge individual users to effectively change their behavior.” (Lanzing 2019, 550)
*A choice architecture is the context in which people make decisions: the way options are organized and presented to the chooser (Thaler & Sunstein 2008).
Main Goal: An ethical critique of self-tracking technologies and their data-collection-based algorithmic hypernudging.
Theoretical Framework: decisional privacy and informational privacy as complementary dimensions
Claim: self-tracking technologies and their hypernudging threaten individuals’ autonomy because they violate both decisional and informational privacy.
Four Steps:
Nudging vs. Hypernudging (1.1 - 1.4)
Re-evaluation of decisional privacy (1.5 - 1.6)
Combination of informational privacy and decisional privacy (1.7 - 1.9)
Three potential objections (1.10)
Also Known As: “life-logging, quantified self, personal analytics, and personal informatics” (Lanzing 2019)
Definition: “the practice of quantifying behavior through extensive self-surveillance for the purpose of behavioral change” (Lanzing 2019)
How-to: wearable digital devices and/or smartphone apps + social media (external online platforms)
Examples:
Strava: self-tracking & social network for athletes
Other Fitness apps: FitBit, Runkeeper
“Big Data has enabled “personalized” choice architectures: choice architectures that are designed according to user data feedback. Personalized feedback in self-tracking is based on the analysis of large aggregates of (personal) information or “Big Data,” also referred to as personal analytics. The analysis aims to identify patterns and interesting correlations in the data. Based on the analysis, many devices, and apps make suggestions to their users about how they can change or improve their behavior, what choices to make.” (Lanzing 2019)
Main Attraction: personalized feedback and thus self-improvement (and empowerment)
Main Concern: Big Data-driven personalized choice architectures
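The data-driven feedback loop described above can be sketched as a toy example (the step counts, threshold, and message wording are hypothetical, not from Lanzing or any real app): the app mines a pattern from other users’ aggregated data and turns it into a personalized suggestion.

```python
from statistics import mean

# Hypothetical aggregate: weekly step counts from many users ("Big Data").
population_steps = [42_000, 55_000, 38_000, 61_000, 47_000]

def personalized_feedback(user_steps: int) -> str:
    """Nudge the user by comparing them to a population-level pattern."""
    baseline = mean(population_steps)  # pattern mined from other users' data
    if user_steps < baseline:
        deficit = int(baseline - user_steps)
        return f"You walked {deficit} fewer steps than average -- try a daily walk goal!"
    return "Great job! Share your streak with friends?"

print(personalized_feedback(40_000))
```

Note that the “personalized” suggestion is derived entirely from other people’s data, which is exactly the dynamic the paper scrutinizes.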
These are some interesting articles I found on self-tracking; feel free to take a look!
The Dark Side of Self-Tracking
‘The bot asked me four times a day how I was feeling’: is tracking everything actually good for us?
One critique:
Potentially manipulative
“Manipulation, as I understand it here, refers to the intentional but “hidden” steering of people’s choices by promoting and shaping decision-making processes that persons generally would not use for making rational decisions (Wilkinson 2013: 347; Goodin 1980: 17).” (Lanzing 2019)
Two conditions (for legitimate nudges):
i. Do not compromise individual freedom and autonomy.
ii. Do not reduce or eliminate options; simply re-arrange the individual’s choice architecture so it favors specific options.
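Condition ii can be illustrated with a minimal sketch (a hypothetical cafeteria menu, in the spirit of Thaler and Sunstein): the nudge only re-orders the options to make one salient; nothing is removed from the option set.

```python
def nudge(options: list[str], favored: str) -> list[str]:
    """Re-arrange the choice architecture: move the favored option first.
    No option is removed, so freedom of choice is formally preserved."""
    assert favored in options
    return [favored] + [o for o in options if o != favored]

menu = ["fries", "salad", "burger"]
print(nudge(menu, "salad"))  # salad becomes the salient first option
```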
How Do Nudges Work?
Nudges influence and steer decision-making via psychological mechanisms.
What Are Bad Nudges?
i.Bad nudges deprive the individual of their capacity to make an informed, rational choice.
ii.Bad nudges are hidden, hard to opt out of, and serve the nudger’s interests rather than the nudgee’s.
What Are Good Nudges?
i.Good nudges are transparent and straightforward.
ii.Good nudges are easy to opt out of.
iii.Good nudges intend to improve the welfare of the nudgee.
Also known as: “Big Data driven decision-guidance processes” and “recommender systems.”
Definition: “the algorithmic real-time personalization and reconfiguration of choice architectures based on large aggregates of (personal) data.” (Lanzing 2019)
By constantly (re)-configuring and thereby personalizing the user’s informational choice context, typically through algorithmic analysis of data streams from multiple sources claiming to offer predictive insights concerning the habits, preferences, and interests of targeted individuals, these nudges channel users’ choices in directions preferred by the choice architect through processes that are subtle, unobtrusive, yet extraordinarily powerful (Yeung 2017: 119). (Lanzing 2019)
*Karen Yeung: the legal scholar who coined the term “hypernudge” (Yeung 2017).
Data source:
Live data streams + Already stored data
Your personal data + All the data of other users
Examples:
Personalized Fitness App Motivations
Dynamic Pricing and Product Placement
Smart Home Energy Consumption Optimization
| Features | Nudges | Hypernudges |
|---|---|---|
| Dynamism (the real time, personalized feedback dynamic) | non-specific, aimed at the general public, “one size fits all” | one-to-many (millions!), real time, personalized, via surveillance and data collection |
| Predictive Capacity | no instant feedback and thus no immediate adjustment | real time algorithmic behavioral predictions and constant reconfiguration based on machine learning and instant feedback |
| Hiddenness and Hidden Intentions (the final, overarching, and most important) | tucked away but still detectable, “visible” in the physical world | well-hidden, ubiquitous, unobtrusive, essentially for corporate profit, driven by high-tech |
| A Good Nudge Criteria | A Nudge | A Hypernudge |
|---|---|---|
| Transparency | low yet detectable | unobtrusive thus misleading & unjustified |
| Easy to Opt out | could be | highly persuasive, well-hidden, system-default, opting out means quitting the service for good, and the option to opt out might not be clear (enough) |
| Welfare for the Nudgee | could benefit the nudgee | essentially for corporate benefit, potentially more aligned with corporate interests |
Definition: “the ability to control who has access to one’s personal information and to what extent (Westin 1967)” (Lanzing 2019)
Related Concept: reasonable expectations (dynamic and context dependent, constitute social norms)
Definition: “…the right against unwanted access such as unwanted interference in our decisions and actions (Allen 1988: 97; Roessler 2005: 9)” (Lanzing 2019)
Descriptions:
Narrow view: “nongovernmental decision-making”, intimate choices.
Broad view: fundamental life decisions, actions, modes of behavior, ways of life.
In Relation to Autonomy:
“…regulates the access of others in the form of interpretation, objection, commenting, and other forms of intervention in the way you live your life.” (Lanzing 2019)
“…provides the necessary breathing space to carry out one’s chosen life unhindered in different social contexts, which is important for leading a self-determined life and so, for autonomy (Roessler 2005: 80).” (Lanzing 2019)
“…protects you from the interference of others, from the “chilling effect”: conforming your actions to perceived social norms out of fear for (social) sanctions. ” (Lanzing 2019)
In Relation to Hypernudging:
Decisional privacy theories criticize how hypernudges use personal data to interfere with individuals’ decision-making processes.
Decisional privacy concerns should therefore not be reduced to or merely understood in terms of informational privacy concerns. Decisional privacy could be a promising complementary conceptual tool for criticizing hypernudging. (Lanzing 2019)
Key Takeaway:
The violation of one’s decisional privacy does not entail an immediate loss of personal autonomy. People should be regarded as autonomous agents who are free to make decisions and to accept certain interferences depending on social norms and contexts. However, commercial choice architectures and hypernudges intrude on personal autonomy by introducing commercially driven interference in contexts where it is not welcome.
Interconnection:
Informational and decisional privacy are both closely related to the protection of individual autonomy as two mutually reinforcing dimensions.
Violations of informational privacy naturally translate into violations of one’s decisional privacy.
i.People have no control over their data, including data about their private decisions, nor any knowledge of how these data are used by commercial enterprises.
ii.Big data-driven corporations and commercial third parties are blurring the boundaries between contexts and norms of sharing and accessing personal information.
iii.Past behavioral data and the data of an entire population are used to hypernudge a specific individual, resulting in filter bubbles and collaborative filtering.
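The collaborative-filtering pattern in point iii — steering one user with the data of the whole population — can be sketched minimally (the users, apps, ratings, and similarity score below are all hypothetical):

```python
# Each row: one user's app ratings (0 = never used). Population-wide data
# is used to steer a single target user -- the pattern criticized above.
ratings = {
    "alice": {"run_app": 5, "diet_app": 4, "sleep_app": 0},
    "bob":   {"run_app": 5, "diet_app": 5, "sleep_app": 4},
    "carol": {"run_app": 1, "diet_app": 0, "sleep_app": 5},
}

def similarity(u: dict, v: dict) -> int:
    """Crude overlap score: agreement on items both users rated."""
    shared = [i for i in u if u[i] and v[i]]
    return sum(4 - abs(u[i] - v[i]) for i in shared)

def recommend(target: str, ratings: dict) -> str:
    """Suggest the unseen item favored by the most similar other user."""
    me = ratings[target]
    others = {name: r for name, r in ratings.items() if name != target}
    peer = max(others, key=lambda name: similarity(me, others[name]))
    unseen = [i for i in me if me[i] == 0]
    return max(unseen, key=lambda i: others[peer][i])

print(recommend("alice", ratings))
```

Alice never chose to reveal an interest in sleep tracking; the recommendation is inferred entirely from Bob’s behavior, illustrating how boundaries between users’ data contexts blur.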
i.Hypernudges interfere with people’s decision-making with hidden intentions and effects; people do not know why or how they are manipulated into making a choice that favors a third party.
ii.Not all options are visible and available to the nudgee; options are preselected by algorithms in favor of those in control of the technology.
iii.No one can be held accountable: a nudgee does not know by whom their decisions are steered.
iv.Hypernudges affect many people in many domains at the same time.
O1: Some people are willingly sharing their data in order to gain more independence via self-tracking technologies.
Response: A technology that violates an individual’s informational and decisional privacy, and thus their autonomy, cannot scaffold that autonomy.
O2: The fact that there are people using such technologies means that they consent to such operations and their effects.
Response: People consent to self-tracking technologies, but their consent is no longer meaningful or informed.
O3: Simply not using self-tracking technologies seems to be the most effective strategy.
Response: Not everyone can afford to fully opt out, and self-tracking technologies are becoming increasingly institutionalized. Opting out simply avoids the issue rather than solving it.
post the discussion question right here